Quasi-Newton Modifications to SNOPT

Author

  • KATHY JENSEN
Abstract

This work considers the nonlinear inequality-constrained problem (NIP): minimize f(x) subject to c(x) ≥ 0, where f : Rⁿ → R and c : Rⁿ → Rᵐ are twice continuously differentiable functions. Let g(x) denote the gradient of f(x), and J(x) the m × n Jacobian matrix of c(x), whose rows are the gradients of the component functions cᵢ(x). A point x∗ is feasible with respect to the constraints c(x) if c(x∗) ≥ 0, and infeasible if cᵢ(x∗) < 0 for some i. The constraint cᵢ(x) is active at x if cᵢ(x) = 0 and inactive if cᵢ(x) > 0. Let A(x) be the submatrix of J(x) consisting of the gradients of the constraints active at x, and denote the nullspace matrix of A(x) by Z(x), so that the columns of Z(x) form a basis for the nullspace of A(x). Let the active set A(x) be the set of indices of those constraints active at x. A point x∗ is a local minimizer of NIP if it is feasible with respect to the constraints and there exists a neighborhood of x∗ such that f(x∗) ≤ f(x) for all feasible points x in the neighborhood. If the inequality is strict for all feasible x ≠ x∗, then x∗ is a strong local minimizer; otherwise, x∗ is a weak local minimizer.

Nonlinearly constrained problems have an abundance of diverse applications in engineering, finance, trajectory optimization, and many other fields, and the scope of scientific applications solved with nonlinearly constrained optimization software will only increase. No matter how well optimization algorithms perform, there will always be a demand for improvement as larger and more advanced mathematical models are developed. Nonlinearly constrained problems are considerably more difficult to solve than unconstrained or linearly constrained problems, partly because obtaining and maintaining feasibility is itself an iterative procedure. Another difficulty results from the crucial curvature ...
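To make these definitions concrete, the sketch below (an illustration written for this summary, not code from the thesis) evaluates the constraints at a point, identifies the active set, and computes a nullspace basis Z(x) for the active-constraint Jacobian A(x). The tolerance tol and the example constraint functions are hypothetical choices.

```python
import numpy as np
from scipy.linalg import null_space

def active_set_and_nullspace(c, J, x, tol=1e-8):
    """Classify the constraints c(x) >= 0 at x and build a nullspace
    basis Z for the rows of J(x) belonging to the active constraints."""
    cx = c(x)                                # constraint values, shape (m,)
    Jx = J(x)                                # Jacobian J(x), shape (m, n)
    active = np.where(np.abs(cx) <= tol)[0]  # indices with c_i(x) = 0
    infeasible = np.where(cx < -tol)[0]      # indices with c_i(x) < 0
    A = Jx[active, :]                        # active-constraint Jacobian A(x)
    Z = null_space(A) if active.size else np.eye(x.size)
    return active, infeasible, Z

# Example: c_1(x) = x_1 >= 0 and c_2(x) = 1 - x_1^2 - x_2^2 >= 0.
c = lambda x: np.array([x[0], 1.0 - x[0]**2 - x[1]**2])
J = lambda x: np.array([[1.0, 0.0], [-2.0 * x[0], -2.0 * x[1]]])

x = np.array([0.0, 1.0])          # both constraints are active here
active, infeasible, Z = active_set_and_nullspace(c, J, x)
print(active)                     # [0 1]
print(Z.shape)                    # (2, 0): A(x) has rank 2, trivial nullspace
```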


Similar Resources

A Numerical Study of Limited Memory BFGS

The application of quasi-Newton methods is widespread in numerical optimization. Independently of the application, the techniques used to update the BFGS matrices seem to play an important role in the performance of the overall method. In this paper we address precisely this issue. We compare two implementations of the limited memory BFGS method for large-scale unconstrained problems. They differ...
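As a concrete illustration of the limited-memory idea discussed in this abstract, the sketch below implements the standard two-loop recursion, which applies the L-BFGS inverse-Hessian approximation to a gradient using only the stored recent (s, y) pairs. The variable names and the initial scaling are conventional choices, not taken from the paper.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: returns -H_k @ grad, where H_k is the L-BFGS
    inverse-Hessian approximation built from the stored pairs
    s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i."""
    q = grad.copy()
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / y.dot(s)
        alpha = rho * s.dot(q)
        alphas.append(alpha)
        q -= alpha * y
    # Conventional initial scaling H_0 = gamma * I.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = s.dot(y) / y.dot(y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: traverse pairs from oldest to newest.
    for (s, y), alpha in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / y.dot(s)
        beta = rho * y.dot(r)
        r += (alpha - beta) * s
    return -r
```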


SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization

Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. Second derivatives are assumed...


SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization

Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. We discuss an SQP algorithm th...
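To make the SQP idea concrete: at each iterate the method minimizes a quadratic model of the objective subject to linearized constraints. The sketch below is a deliberately simplified illustration with equality constraints only, not SNOPT's algorithm; it takes one SQP step by solving the KKT system of the QP subproblem.

```python
import numpy as np

def sqp_step(g, J, cx, B):
    """One SQP step for  min f(x)  s.t.  c(x) = 0:
    solve the QP subproblem  min g'p + 0.5 p'Bp  s.t.  J p + c = 0
    through its KKT system, returning the step p and the multipliers."""
    n, m = B.shape[0], cx.size
    K = np.block([[B, J.T],
                  [J, np.zeros((m, m))]])
    rhs = -np.concatenate([g, cx])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]

# Example: min x1^2 + x2^2  s.t.  x1 + x2 = 1, starting at x = (1, 0).
x = np.array([1.0, 0.0])
g = 2.0 * x                        # gradient of the objective at x
J = np.array([[1.0, 1.0]])         # Jacobian of the constraint
cx = np.array([x.sum() - 1.0])     # constraint residual (0 here)
B = 2.0 * np.eye(2)                # exact Hessian of the objective
p, lam = sqp_step(g, J, cx, B)
print(x + p)                       # [0.5 0.5]: one step reaches the
                                   # minimizer, since the problem is a QP
```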


On attraction of linearly constrained Lagrangian methods and of stabilized and quasi-Newton SQP methods to critical multipliers

It has been previously demonstrated that when a Lagrange multiplier associated with a given solution is not unique, Newton iterations [e.g., those of sequential quadratic programming (SQP)] have a tendency to converge to special multipliers, called critical multipliers (when such critical multipliers exist). This fact is of importance because critical multipliers violate the second-or...


Block BFGS Methods

We introduce a quasi-Newton method with block updates called Block BFGS. We show that this method, performed with inexact Armijo-Wolfe line searches, converges globally and superlinearly under the same convexity assumptions as BFGS. We also show that Block BFGS is globally convergent to a stationary point when applied to non-convex functions with bounded Hessian, and discuss other modifications...
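One standard block generalization of the rank-two BFGS update, sketched below under assumptions of this summary rather than taken from the paper, builds a rank-2q update from a block S of q steps and a block Y of corresponding curvature vectors. Whenever SᵀY is symmetric positive definite, the result is symmetric and satisfies the block secant condition B₊S = Y; the paper's exact rule may differ.

```python
import numpy as np

def block_bfgs_update(B, S, Y):
    """Block BFGS update:  B+ = B - BS (S'BS)^{-1} (BS)' + Y (S'Y)^{-1} Y'.
    S and Y are n x q; assuming S'Y is symmetric positive definite, B+ is
    symmetric and satisfies the secant equations B+ S = Y (a sketched
    form of the block update, not necessarily the paper's exact rule)."""
    BS = B @ S
    return (B - BS @ np.linalg.solve(S.T @ BS, BS.T)
              + Y @ np.linalg.solve(S.T @ Y, Y.T))

# Check on a convex quadratic f(x) = 0.5 x'Ax, where Y = A S makes
# S'Y = S'AS symmetric: the update reproduces A on span(S).
rng = np.random.default_rng(0)
A = np.diag([1.0, 2.0, 3.0])
S = rng.standard_normal((3, 2))
Y = A @ S
B = block_bfgs_update(np.eye(3), S, Y)
print(np.allclose(B @ S, Y))   # True: block secant condition holds
```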




Publication date: 2006